
Allow distribute scala 2.12 and update to spark 2.4.3 #1308

Closed
wants to merge 11 commits

Conversation

neoramon

What do you think about distributing Elasticsearch Spark in Scala 2.12 as well?

@ghost

ghost commented Jun 22, 2019

Hi @neoramon, we have found your signature in our records, but it seems like you have signed with a different e-mail than the one used in your Git commit. Can you please add both of these e-mails to your GitHub profile (they can be hidden), so we can match your e-mails to your GitHub profile?

@neoramon
Author

@jbaiera
Member

jbaiera commented Aug 23, 2019

Sorry for the wait on this issue folks. I'll look into getting this tested and reviewed soon. The way we cross compile Scala in the build has been up in the air for a while due to some external forces on the project (mostly needing to upgrade our version of Gradle, but that upgrade broke our Scala cross compile stuff). Now that the concerns around our compile logic are clearing up, we can start looking at this in earnest.

@EmergentOrder

@jbaiera Any update here?

# same as Spark's
scala210Version = 2.10.7
scala210MajorVersion = 2.10
scala211Version = 2.11.12
scala211MajorVersion = 2.11
scala212Version = 2.12.8

2.12.10 is the most recent

Author

Ok, I will change.
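
For context on why both a full version and a major version are tracked: Scala artifacts are published with the major version as a suffix, so a cross-building Gradle setup can pick dependency coordinates from these properties. The following is a minimal sketch only, assuming the property names from the snippet above; the Spark dependency and version shown are illustrative, not the project's actual build logic.

// Sketch: using the version properties from gradle.properties to select
// Scala-suffixed artifacts for a 2.12 variant.
def scalaMajor   = project.property('scala212MajorVersion')   // e.g. "2.12"
def scalaFull    = project.property('scala212Version')        // e.g. "2.12.8"
def sparkVersion = '2.4.3'                                    // illustrative

dependencies {
    implementation "org.scala-lang:scala-library:${scalaFull}"
    // Spark artifacts carry the Scala major version as a suffix, so switching
    // the property switches the cross-build variant.
    implementation "org.apache.spark:spark-sql_${scalaMajor}:${sparkVersion}"
}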

@f1yegor

f1yegor commented Nov 25, 2019

@neoramon @jbaiera any update on the issue? Do you need any help?
The current PR has a conflict that should be easy to resolve when merging into master.

I will build with 2.12 locally because my project depends on it; feel free to ask me for any fixes needed to land this in master.

@neoramon
Author

Done

@fistan684

Any updates? Looks like it's almost there. :)

@neoramon
Author

At least Java 11 is required to use elasticsearch gradle tools

It seems that Travis doesn't have Java 11 config.

@f1yegor

f1yegor commented Dec 10, 2019

At least Java 11 is required to use elasticsearch gradle tools

It seems that Travis doesn't have Java 11 config.

I would question the legitimacy of that move (Java 8 -> Java 11) for the library build; to my knowledge Spark (2.4, I haven't dug into the 3.0 topic yet) doesn't explicitly support Java 11.

Sorry, when I reread the thread I got your point. With the comment above I was pointing out the discrepancy.
Whether we should somehow separate the two concerns (library dependency vs. Gradle build-tools dependency), I don't know.
In our project we used Java 11 by mistake, and the first unit tests I wrote failed because of it.

@neoramon
Author

At least Java 11 is required to use elasticsearch gradle tools

It seems that Travis doesn't have Java 11 config.

I would question the legitimacy of that move (Java 8 -> Java 11) for the build; to my knowledge Spark (2.4, I haven't dug into the 3.0 topic yet) doesn't explicitly support Java 11.

I'm not sure why this build needs Java 11, but there is a validation in the build requiring Java >= 11.

buildSrc/build.gradle
if (JavaVersion.current() < JavaVersion.VERSION_11) {
    throw new GradleException('At least Java 11 is required to use elasticsearch gradle tools')
}

In the end, it also needs Java 8 for Hadoop, and I don't know how to set up two JDKs in Travis.

:(

@jbaiera
Member

jbaiera commented Dec 10, 2019

@neoramon I share your confusion on the Travis CI situation. I've tried a number of configurations and none of them have panned out. We are working on changing the CI for the project to use the public Jenkins builds for the CI checks instead.

As for the reasoning behind the move to Java 11+ for the build: We use the Elasticsearch project Gradle build tools to stand up clusters for our testing. In order to stay in sync with the versions of the build tools (which only support Java 12), we need the build to run on that same version of Java. That said, we still need Java 8 in an environment variable in order to compile Scala with the correct versions.
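
To make that split concrete: one common way to run the Gradle daemon on a newer JDK while still compiling and testing against Java 8 is to fork those tasks onto a JDK located via an environment variable. This is a minimal sketch only, assuming a JAVA8_HOME variable; the variable name and task wiring are illustrative and not taken from the project's actual build scripts.

// Sketch: Gradle itself runs on Java 11+, but compilation and test execution
// are forked onto a JDK 8 found through an environment variable.
def java8Home = System.getenv('JAVA8_HOME')   // assumed to point at a JDK 8 install

tasks.withType(JavaCompile).configureEach {
    options.fork = true
    options.forkOptions.javaHome = file(java8Home)
}

tasks.withType(Test).configureEach {
    executable = "${java8Home}/bin/java"
}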

@cdmitri

cdmitri commented Jan 1, 2020

👍 @neoramon @jbaiera - thank you for working on this!

I've compiled on JDK 8 and Scala 2.12.10 with basic app tests passing, just the source code changes (I was halfway through before I found this PR ;)

The Dataproc preview image on GCP is on 2.12 now, so this is becoming more critical. Adding my +1 here.

@neoramon - thank you for the PR!

@rlmark

rlmark commented Feb 5, 2020

Hi there! Any updates on this? Is there any way to help? Thanks for your work on this issue so far! 😄

@cdmitri

cdmitri commented Feb 6, 2020

Hi there! Any updates on this? Is there any way to help? Thanks for your work on this issue so far! 😄

@rlmark - I'm watching this as well, here is the latest relevant comment: #1412 (comment)

@jbaiera
Member

jbaiera commented Feb 7, 2020

Hi all, a quick update here: I've logged an issue on the project pertaining to our growing pains with building Scala. This PR will most likely need to wait for that issue to be resolved before it can be moved any further.

@Habitats

Habitats commented Apr 9, 2020

Do you have a rough timeline for when you think these things will happen @jbaiera?

And, is it possible to hack this together by building some of the parts myself?

@rex-remind101

Any updates on this?

@rex-remind101

@neoramon I'm trying to build your fork locally (https://github.com/neoramon/elasticsearch-hadoop) and it seems to fail with

> Task :buildSrc:compileGroovy FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':buildSrc:compileGroovy'.
> Compilation failed; see the compiler error output for details.

please help?

tomdubiel added a commit to MicroFocus/elasticsearch-hadoop that referenced this pull request Aug 11, 2020
- modelled from elastic#1308
- added a script to install it locally for my mediocre test purposes
fnour pushed a commit to MicroFocus/elasticsearch-hadoop that referenced this pull request Aug 11, 2020
- modelled from elastic#1308
- added a script to install it locally for my mediocre test purposes
@ktakanopy

Any updates on this? /2

@neoramon
Author

This PR is old now and will need an update from master.
Does anyone know how to fix the problem of running Travis with two JDKs?

@neoramon
Author

I just merged with master and saw that the Travis file was removed, and it looks like all checks have passed! 👍

@ktakanopy

Can someone review and merge?

@adamczarny

Is there any news on when it'll be merged?

# same as Spark's
scala210Version = 2.10.7
scala210MajorVersion = 2.10
scala211Version = 2.11.12
scala211MajorVersion = 2.11
scala212Version = 2.12.10


Latest Scala 2.12 version is now 2.12.12

@neoramon
Author

I'm closing this PR.
@jbaiera already updated the Spark and Scala versions in this PR: https://github.com/elastic/elasticsearch-hadoop/pull/1551/files

@neoramon neoramon closed this Nov 19, 2020
@vishnugs

I'm facing this issue as well with Spark 3.0, and there is no option to use Scala 2.11, since Spark supports a minimum of Scala 2.12 from 3.0.0 onwards.
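
For consumers hitting the same wall: once a Scala 2.12 cross-build is published, pulling it in should be a one-line dependency change, assuming the project keeps its existing elasticsearch-spark-20_<scala major> artifact naming. The coordinates below are illustrative only; the version is a placeholder, not a statement of what was actually released.

// Illustrative consumer dependency on a Scala 2.12 cross-build of the connector.
// The version is a placeholder, not an actual published release.
def esHadoopVersion = 'X.Y.Z'

dependencies {
    implementation "org.elasticsearch:elasticsearch-spark-20_2.12:${esHadoopVersion}"
}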
